23 Convergence
To explain the convergence of random variables, we first review the definition of convergence for a sequence of real numbers.
Definition 23.1 A sequence of real numbers \{ a_{i} \}_{i=1}^{\infty}, a_{i} \in \mathbb{R}, converges to a limit a \in \mathbb{R} if the distance between a and every sufficiently late term a_{n} is arbitrarily small,
\forall \epsilon > 0, \exists n_{0} \in \mathbb{N}: d (a_{n}, a) \leq \epsilon, \forall n > n_{0},
which is often expressed as
a = \lim_{n \to \infty} a_{n}.
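As a minimal numeric sketch of this \epsilon/n_{0} definition, consider the (assumed, illustrative) sequence a_{n} = 1/n with limit a = 0: for any \epsilon > 0, choosing n_{0} = \lceil 1/\epsilon \rceil makes the defining inequality hold for all n > n_{0}.

```python
import math

def n0_for_epsilon(epsilon):
    """A convenient n_0 with |a_n - a| <= epsilon for all n > n_0,
    for the example sequence a_n = 1/n with limit a = 0."""
    # |1/n - 0| <= epsilon whenever n >= 1/epsilon, so n_0 = ceil(1/epsilon) works
    return math.ceil(1 / epsilon)

for eps in (0.1, 0.01, 0.001):
    n0 = n0_for_epsilon(eps)
    # verify the defining inequality on a finite window of indices past n_0
    assert all(abs(1 / n - 0) <= eps for n in range(n0 + 1, n0 + 1000))
```

The check is necessarily finite, but the inequality |1/n| \leq \epsilon holds for every n > n_{0} by the same algebra used to pick n_{0}.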
The concept of convergence for sequences of random variables is similar; the differences lie in how a sequence of random variables is defined and in how the distance between two random variables is measured.
Sequence of random variables
Definition 23.2 (Sequence of random variables) A sequence of random variables
\{ X_{i} \}_{i=1}^{\infty} = X_{1}, X_{2}, \dots
is a collection of random variables defined on the same sample space \Omega, which means each random variable X_{i} in the sequence is a function from \Omega to \mathbb{R}.
In general, the distribution of each random variable X_{i} in the sequence may depend on its index i.
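A toy sketch of this definition (an assumed example, not from the text): take the sample space \Omega = [0, 1) and define the sequence X_{n}(\omega) = \omega^{n}. Each X_{n} is a function from \Omega to \mathbb{R}, and its distribution depends on the index n, yet every X_{n} is evaluated on the same outcome \omega.

```python
def X(n, omega):
    """The n-th random variable in the (hypothetical) sequence X_n(omega) = omega ** n,
    viewed as a function from the sample space Omega = [0, 1) to R."""
    assert 0.0 <= omega < 1.0
    return omega ** n

# the same outcome omega is plugged into every variable of the sequence
omega = 0.5
sequence_values = [X(n, omega) for n in (1, 2, 3)]
assert sequence_values == [0.5, 0.25, 0.125]
```

Fixing \omega and letting n vary produces an ordinary real-number sequence, which is exactly the viewpoint used for almost sure convergence below.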
Convergence of random variables
Convergence in distribution
The distance function in the definition of convergence in distribution is the difference between the CDFs of the random variables, and convergence in distribution occurs when this difference goes to zero pointwise.
Definition 23.3 (Convergence in distribution) A sequence of random variables \{ X_{i} \}_{i=1}^{\infty} converges in distribution to the random variable X if
\lim_{n \to \infty} F_{X_{n}} (x) = F_{X} (x)
for every number x \in \mathbb{R} at which F_{X} is continuous.
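A minimal numeric check of this definition, for an assumed example: let X_{n} be uniform on the grid \{1/n, 2/n, \dots, n/n\} and X \sim \mathrm{Uniform}(0, 1). Then F_{X_{n}}(x) = \lfloor nx \rfloor / n on [0, 1], and the pointwise gap to F_{X}(x) = x is at most 1/n.

```python
import math

def F_Xn(x, n):
    """CDF of X_n, uniform on the grid {1/n, 2/n, ..., 1} (assumed example)."""
    return min(max(math.floor(n * x) / n, 0.0), 1.0)

def F_X(x):
    """CDF of the limiting variable X ~ Uniform(0, 1)."""
    return min(max(x, 0.0), 1.0)

# the pointwise gap |F_{X_n}(x) - F_X(x)| shrinks as n grows, at every x
for x in (0.25, 0.5, 0.9):
    gaps = [abs(F_Xn(x, n) - F_X(x)) for n in (10, 100, 1000)]
    assert gaps[0] >= gaps[1] >= gaps[2]
    assert gaps[2] <= 1 / 1000
```

Here X_{n} is discrete and X is continuous, which illustrates that convergence in distribution only constrains CDFs, not the type of the random variables themselves.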
Convergence in probability
The distance used in the definition of convergence in probability is the probability that the absolute difference between the random variables exceeds a threshold \epsilon.
Definition 23.4 (Convergence in probability) A sequence of random variables \{ X_{i} \}_{i=1}^{\infty} converges in probability to the random variable X if, for every \epsilon > 0,
\lim_{n \to \infty} \mathbb{P} (\lvert X_{n} - X \rvert > \epsilon) = 0.
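A simulation sketch of this definition, under an assumed example (not from the text): by the weak law of large numbers, the sample mean \bar{X}_{n} of i.i.d. Bernoulli(0.5) draws converges in probability to 0.5, so the estimated \mathbb{P}(\lvert \bar{X}_{n} - 0.5 \rvert > \epsilon) should shrink as n grows.

```python
import random

def prob_deviation(n, eps, trials=2000, seed=0):
    """Monte Carlo estimate of P(|Xbar_n - 0.5| > eps) for the sample mean
    of n i.i.d. Bernoulli(0.5) draws (hypothetical helper for illustration)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        xbar = sum(rng.random() < 0.5 for _ in range(n)) / n
        if abs(xbar - 0.5) > eps:
            hits += 1
    return hits / trials

# the deviation probability decreases toward 0 as n grows
estimates = [prob_deviation(n, eps=0.05) for n in (10, 100, 1000)]
assert estimates[0] > estimates[1] > estimates[2]
```

Note the limit here is the constant random variable X = 0.5; convergence in probability to a constant is a common special case.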
Corollary 23.1 Convergence in probability implies convergence in distribution; in other words, convergence in probability is a stronger mode of convergence than convergence in distribution.
Almost sure convergence
Definition 23.5 (Almost sure convergence) A sequence of random variables \{ X_{i} \}_{i=1}^{\infty} converges almost surely to the random variable X if the values X_{n} (\omega) approach X (\omega) for every outcome \omega except possibly a set of probability 0,
\mathbb{P} (\omega \in \Omega: \lim_{n \to \infty} X_{n} (\omega) = X (\omega)) = 1.
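A simulation sketch of almost sure convergence, again under an assumed example: by the strong law of large numbers, along almost every sample path \omega the running mean of i.i.d. Bernoulli(0.5) draws converges to 0.5 as an ordinary sequence of real numbers.

```python
import random

def running_mean_path(n, seed):
    """One sample path omega (fixed by the seed): the running mean after each
    of n fair coin flips (hypothetical helper for illustration)."""
    rng = random.Random(seed)
    total, path = 0, []
    for i in range(1, n + 1):
        total += rng.random() < 0.5
        path.append(total / i)
    return path

# each simulated path (each omega) settles near the limit 0.5
for seed in range(5):
    path = running_mean_path(20000, seed)
    assert abs(path[-1] - 0.5) < 0.02
```

The contrast with convergence in probability is that here the check is made path by path: each fixed \omega yields a deterministic real sequence X_{n}(\omega), and almost every such sequence converges in the sense of Definition 23.1.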
Corollary 23.2 Almost sure convergence implies convergence in probability; in other words, almost sure convergence is an even stronger mode of convergence than both convergence in probability and convergence in distribution.